AI Chatbots to Help with Mental Health Struggles
2024-04-02
From VOA Learning English, this is the Health & Lifestyle report.
The mental health chatbot Earkick greets users with a friendly-looking panda that could fit easily in a children's program.
When users talk about anxiety, the panda gives the kind of comforting statements that a trained mental health professional, called a therapist, would say.
Then it might suggest breathing exercises or give advice on how to deal with stress.
Earkick is one of hundreds of free chatbots aimed at dealing with a mental health crisis among young people.
But the co-founder of Earkick, Karin Andrea Stephan, says she and the other creators do not "feel comfortable" calling their chatbots a therapy tool.
Whether these chatbots, or apps, provide a simple self-help tool or mental health treatment is important to the growing digital health industry.
Since the apps do not claim to diagnose or treat medical conditions, they do not need approval from the Food and Drug Administration (or FDA).

The use of AI chatbots

The industry's position is now coming under more careful examination with recent developments of chatbots powered by artificial intelligence (AI).
The technology uses a large amount of data to copy human language.
The upsides are clear: the chatbots are free; they are available 24 hours a day; and people can use them in private.
Now for the downsides: there is limited data that the chatbots improve mental health; and they have not received FDA approval to treat conditions like depression.
Vaile Wright is a psychologist and technology director with the American Psychological Association.
She said users of these chatbots "have no way to know whether they're actually effective."
Wright added that the chatbots are not the same as traditional mental health treatment.
But, she said, they could help some people with less severe mental and emotional problems.
Earkick's website states that the app does not "provide any form of medical care, medical opinion, diagnosis or treatment."
Some health lawyers say such claims are not enough.
Glenn Cohen of Harvard Law School said, "If you're really worried about people using your app for mental health services, you want a disclaimer that's more direct..." He suggested, "This is just for fun."
Still, chatbots are already playing a role due to an ongoing shortage of mental health professionals.

Shortage of mental health professionals

Britain's National Health Service has begun offering a chatbot called Wysa to help with stress, anxiety and depression among young people.
This includes those waiting to see a therapist. Some health insurers, universities, and hospitals in the United States are offering similar programs.
Dr. Angela Skrzynski is a family doctor in the American state of New Jersey.
When she tells her patients how long it will take to see a therapist, she says they are usually very open to trying a chatbot.
Her employer, Virtua Health, offers Woebot to some adult patients.
Founded in 2017 by a Stanford-trained psychologist, Woebot does not use AI programs.
The chatbot uses thousands of structured language models written by its staff and researchers.
Woebot founder Alison Darcy says this rules-based model is safer for health care use.
The company is testing generative AI models, but Darcy says there have been problems with the technology.
She said, "We couldn't stop the large language models from... telling someone how they should be thinking, instead of facilitating the person's process."
Woebot's finding was included in a research paper on AI chatbots published last year in Digital Medicine.
The writers concluded that chatbots could help with depression in a short time.
But there was no way to study their long-term effect on mental health.
Ross Koppel of the University of Pennsylvania studies health information technology.
He worries these chatbots could be used in place of treatment and medications.
Koppel and others would like to see the FDA review and possibly regulate these chatbots.
Dr. Doug Opel works at Seattle Children's Hospital.
He said, "There's a whole host of questions we need to understand about this technology so we can ultimately do what we're all here to do: improve kids' mental and physical health."
And that's the Health & Lifestyle report. I'm Anna Matteo.
Matthew Perrone reported this story for the Associated Press from Washington, D.C. Anna Matteo adapted it for VOA Learning English.

__________________________________________________

Words in This Story

chatbot - n. a computer program or character (as in a game) designed to mimic the actions of a person and to converse with human beings

anxiety - n. an abnormal and overwhelming sense of apprehension and fear often marked by physical signs

diagnose - v. to recognize (something, such as a disease) by signs and symptoms

artificial intelligence - n. the capability of computer systems or algorithms to imitate intelligent human behavior

psychologist - n. a person who specializes in the study of mind and behavior or in the treatment of mental, emotional, and behavioral disorders

diagnosis - n. the art or act of identifying a disease from its signs and symptoms

facilitate - v. to help bring (something) about